Variational Inference for Latent Variable Modelling of Correlation Structure

Authors

  • Mark van der Wilk
  • Andrew G. Wilson
Abstract

Latent variable models have played an important part in unsupervised learning, where the goal is to capture the structure of complicated observed data in a set of variables that are somehow simpler. PCA and Factor Analysis, for example, model high-dimensional data using lower-dimensional, uncorrelated latent variables. The value of a latent variable represents some underlying, unobserved explanation of the observation. Often the latent variables can be interpreted as having some kind of meaning, which is useful for exploratory data analysis. Many models follow the same principle of explaining an observation by mapping from a "simple" latent space to complicated observations. Much work has been done to increase the flexibility of these mappings. The GPLVM (Lawrence and Hyvärinen, 2005), for example, maps from the latent space with a non-linear function under a Gaussian process prior.
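As a concrete sketch of this principle, a linear-Gaussian latent variable model (the setting underlying PCA and Factor Analysis) maps simple, uncorrelated latents to high-dimensional observations, and PCA recovers the latent subspace from the observations. The dimensions and noise level below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian latent variable model: low-dimensional,
# uncorrelated latents z are mapped linearly to high-dimensional
# observations x = W z + noise.
n, d, k = 500, 10, 2                      # samples, observed dim, latent dim
W = rng.normal(size=(d, k))               # linear "decoder"
z = rng.normal(size=(n, k))               # simple latent space: z ~ N(0, I)
x = z @ W.T + 0.1 * rng.normal(size=(n, d))

# PCA recovers the latent subspace from the top eigenvectors of the
# empirical covariance of the observations.
cov = np.cov(x, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
top = eigvecs[:, -k:]                     # basis for the estimated subspace

# The estimated subspace should align with the true column space of W.
proj = top @ top.T                        # projector onto estimated subspace
residual = W - proj @ W                   # part of W outside the subspace
print(np.linalg.norm(residual) / np.linalg.norm(W))  # close to 0
```

The GPLVM replaces the linear map `x = W z` with a non-linear function drawn from a Gaussian process prior, keeping the same "simple latents explain complex observations" structure.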


Related articles

Stochastic Variational Inference for Gaussian Process Latent Variable Models using Back Constraints

Gaussian process latent variable models (GPLVMs) are a probabilistic approach to modelling data that employ a Gaussian process mapping from latent variables to observations. This paper revisits a recently proposed variational inference technique for GPLVMs and methodologically analyses the optimality and different parameterisations of the variational approximation. We investigate a structured va...


Black Box Variational Inference for State Space Models

Latent variable time-series models are among the most heavily used tools from machine learning and applied statistics. These models have the advantage of learning latent structure both from noisy observations and from the temporal ordering in the data, where it is assumed that meaningful correlation structure exists across time. A few highly-structured models, such as the linear dynamical syste...
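The correlation structure across time that such models exploit can be illustrated with a minimal linear-dynamical-system sketch (the dynamics coefficient `a` and noise scales below are arbitrary illustrative choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal linear dynamical system: the latent state evolves as
# z_t = a * z_{t-1} + process noise, observed as x_t = z_t + obs noise.
T, a = 2000, 0.9
z = np.zeros(T)
for t in range(1, T):
    z[t] = a * z[t - 1] + rng.normal(scale=0.5)
x = z + rng.normal(scale=0.1, size=T)

# The temporal ordering induces correlation structure: neighbouring
# observations are strongly correlated, distant ones much less so
# (the latent autocorrelation decays roughly as a**lag).
lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]
lag20 = np.corrcoef(x[:-20], x[20:])[0, 1]
print(lag1, lag20)  # lag-1 near a = 0.9, lag-20 much smaller
```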


Neural Variational Inference for Text Processing

Recent advances in neural variational inference have spawned a renaissance in deep latent variable models. In this paper we introduce a generic variational inference framework for generative and conditional models of text. While traditional variational methods derive an analytic approximation for the intractable distributions over latent variables, here we construct an inference network conditi...


Neural Variational Inference and Learning in Belief Networks

•We introduce a simple, efficient, and general method for training directed latent variable models. – Can handle both discrete and continuous latent variables. – Easy to apply – requires no model-specific derivations. •Key idea: Train an auxiliary neural network to perform inference in the model of interest by optimizing the variational bound. – Was considered before for Helmholtz machines and ...
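The variational bound being optimized can be made concrete on a toy conjugate model, where everything is available in closed form. The model and the `elbo` helper below are illustrative, not the paper's: `z ~ N(0, 1)`, `x | z ~ N(z, 1)`, with a Gaussian approximation `q(z) = N(mu, s2)`:

```python
import math

def elbo(x, mu, s2):
    # ELBO = E_q[log p(x|z)] - KL(q || p(z)) for the toy model
    # z ~ N(0,1), x|z ~ N(z,1), with q(z) = N(mu, s2).
    exp_loglik = -0.5 * math.log(2 * math.pi) - 0.5 * ((x - mu) ** 2 + s2)
    kl = 0.5 * (mu ** 2 + s2 - 1.0 - math.log(s2))  # KL(q || N(0,1))
    return exp_loglik - kl

x = 1.3
# Marginal likelihood: p(x) = N(x; 0, 2).
log_evidence = -0.5 * math.log(4 * math.pi) - x ** 2 / 4

# The exact posterior is N(x/2, 1/2); there the bound is tight.
print(elbo(x, x / 2, 0.5) - log_evidence)   # ~0.0
# Any other q gives a strictly lower bound.
print(elbo(x, 0.0, 1.0) < log_evidence)     # True
```

An inference network, as described above, amortizes this: instead of optimizing `mu` and `s2` separately per data point, a network maps each `x` directly to the parameters of `q`.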


A Mutually-Dependent Hadamard Kernel for Modelling Latent Variable Couplings

We introduce a novel kernel that models input-dependent couplings across multiple latent processes. The pairwise joint kernel measures covariance along inputs and across different latent signals in a mutually-dependent fashion. A latent correlation Gaussian process (LCGP) model combines these non-stationary latent components into multiple outputs by an input-dependent mixing matrix. Probit clas...
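A minimal sketch of the general Hadamard-kernel idea: take the elementwise product of two kernels defined on the same joint index set of (input, latent-signal) pairs. The specific kernels and coupling matrix below are illustrative choices, not the paper's construction:

```python
import numpy as np

# Joint index set: 4 input locations x 2 latent signals.
inputs = np.repeat(np.linspace(0, 3, 4), 2)
signals = np.tile(np.arange(2), 4)

def k_input(a, b):
    # RBF covariance along input locations.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2)

def k_signal(a, b):
    # Coupling between the two latent signals (illustrative values).
    B = np.array([[1.0, 0.6], [0.6, 1.0]])
    return B[np.ix_(a, b)]

# Hadamard (elementwise) product of two PSD kernels is PSD by the
# Schur product theorem, so the joint kernel is a valid GP covariance.
K = k_input(inputs, inputs) * k_signal(signals, signals)
print(np.all(np.linalg.eigvalsh(K) > -1e-9))  # True: valid covariance
```

The input-dependent mixing described above goes further: the coupling itself varies with the inputs rather than being a fixed matrix `B`.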



Journal:

Volume   Issue 

Pages  -

Publication date: 2014